Search results for "Gini index of heterogeneity"

Showing 1 of 1 documents

Extropy: Complementary Dual of Entropy

2015

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
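The abstract's claims can be illustrated with a short sketch. Using the paper's definitions, entropy is H(p) = -Σ p_i log p_i and extropy is J(p) = -Σ (1 - p_i) log(1 - p_i); the snippet below (an illustrative sketch, not from the paper) shows that the two coincide for a binary distribution, since there 1 - p_1 = p_2, but diverge as soon as there are more than two outcomes:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) * log(1 - p_i)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

# Binary distribution: the two measures are identical,
# because 1 - p_1 = p_2 and 1 - p_2 = p_1.
binary = [0.3, 0.7]
print(entropy(binary), extropy(binary))  # equal values

# With three or more outcomes the measures bifurcate.
ternary = [0.2, 0.3, 0.5]
print(entropy(ternary), extropy(ternary))  # different values

# Both are maximized by the uniform distribution.
uniform = [1 / 3] * 3
print(extropy(uniform) > extropy(ternary))
```

As the abstract notes, both measures are invariant under permutations of the mass function, which the symmetric sums above make visible.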

Keywords: entropy (information theory); extropy; differential and relative entropy/extropy; Kullback–Leibler divergence; Bregman divergence; duality; dual function; proper scoring rules; Gini index of heterogeneity; repeat rate; differential entropy; binary entropy function; binary number; axiom; statistical physics

Subject classifications: Information Theory (cs.IT); Statistics Theory (math.ST); Probability (math.PR); Data Analysis, Statistics and Probability (physics.data-an); FOS: Mathematics; FOS: Physical sciences; FOS: Computer and information sciences; General Mathematics; Statistics, Probability and Uncertainty; Settore MAT/06 - Probabilita' E Statistica Matematica; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Settore SECS-S/01 - Statistica

Published in: Statistical Science